In electrical engineering, computer science, and information theory, channel capacity is the tight upper bound on the rate at which information can be reliably transmitted over a communications channel. By the noisy-channel coding theorem, the channel capacity of a given channel is the limiting information rate (in units of information per unit time) that can be achieved with arbitrarily small error probability.

Information theory, developed by Claude E. Shannon during World War II, defines the notion of channel capacity and provides a mathematical model by which it can be computed. The key result states that the capacity of the channel, as defined above, is given by the maximum of the mutual information between the input and output of the channel, where the maximization is with respect to the input distribution.

==Formal definition==

Let \(X\) and \(Y\) be the random variables representing the input and output of the channel, respectively. Let \(p_{Y|X}(y|x)\) be the conditional distribution function of \(Y\) given \(X\), which is an inherent fixed property of the communications channel. Then the choice of the marginal distribution \(p_X(x)\) completely determines the joint distribution \(p_{X,Y}(x,y)\) due to the identity

: \(p_{X,Y}(x,y) = p_{Y|X}(y|x)\,p_X(x),\)

which, in turn, induces a mutual information \(I(X;Y)\). The channel capacity is defined as

: \(C = \sup_{p_X} I(X;Y),\)

where the supremum is taken over all possible choices of \(p_X(x)\).
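For a channel with finitely many input and output symbols, this supremum can be evaluated numerically. The following is a minimal Python sketch (assuming NumPy; the brute-force grid search over input distributions and the binary symmetric channel used as the example are illustrative choices, not part of the definition above) that computes \(I(X;Y)\) from an input distribution and a channel matrix, then maximizes it over input distributions:

```python
import numpy as np

def mutual_information(p_x, channel):
    """I(X;Y) in bits, given input distribution p_x and a channel
    matrix with channel[x, y] = p(y | x)."""
    p_xy = p_x[:, None] * channel          # joint p(x, y) = p(y|x) p(x)
    p_y = p_xy.sum(axis=0)                 # output marginal p(y)
    ratio = p_xy / (p_x[:, None] * p_y[None, :])
    mask = p_xy > 0                        # skip zero-probability terms
    return float((p_xy[mask] * np.log2(ratio[mask])).sum())

# Example: binary symmetric channel with crossover probability p = 0.1.
p = 0.1
bsc = np.array([[1 - p, p],
                [p, 1 - p]])

# X is binary, so p_X is parameterized by a single number q = P(X = 0);
# approximate the supremum by a grid search over q.
qs = np.linspace(0.001, 0.999, 999)
capacity = max(mutual_information(np.array([q, 1 - q]), bsc) for q in qs)

# Known closed form for the BSC: C = 1 - H2(p).
h2 = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
print(f"numerical   C ≈ {capacity:.4f} bits per channel use")
print(f"closed form C = {1 - h2:.4f} bits per channel use")
```

For the binary symmetric channel the maximizing input distribution is uniform (\(q = 1/2\)), so the search recovers the closed form \(C = 1 - H_2(p) \approx 0.531\) bits per channel use at \(p = 0.1\). For larger alphabets, where no closed form exists, the same maximization is typically carried out with the Blahut–Arimoto algorithm rather than a grid search.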